Learning First-Order Probabilistic Models with Combining Rules
Abstract
Many real-world domains exhibit rich relational structure and stochasticity and motivate the development of models that combine predicate logic with probabilities. These models describe probabilistic influences between attributes of objects that are related to each other through known domain relationships. To keep these models succinct, each such influence is considered independent of others, which is called the assumption of “independence of causal influences” (ICI). In this paper, we describe a language that consists of quantified conditional influence statements and captures most relational probabilistic models based on directed graphs. The influences due to different statements are combined using a set of combining rules such as Noisy-OR. We motivate and introduce multi-level combining rules, where the lower level rules combine the influences due to different ground instances of the same statement, and the upper level rules combine the influences due to different statements. We present algorithms and empirical results for parameter learning in the presence of such combining rules. Specifically, we derive and implement algorithms based on gradient descent and expectation maximization for different combining rules and evaluate them on synthetic data and on a real-world task. The results demonstrate that the algorithms are able to learn both the conditional probability distributions of the influence statements and the parameters of the combining rules.
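To make the combining-rule idea concrete, the following Python sketch shows one way a two-level combination could be computed and trained: Noisy-OR combines the probabilities contributed by the ground instances of a single influence statement, and a mean combines the per-statement results, with parameters fitted by plain gradient descent on the log-likelihood. The sigmoid conditional distribution, the choice of mean as the upper-level rule, the finite-difference gradient, and all function names are illustrative assumptions, not the paper's actual model or its derived gradient and EM updates.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def noisy_or(probs):
    # Noisy-OR combining rule: P(y=1) = 1 - prod_i (1 - p_i).
    return 1.0 - np.prod(1.0 - np.asarray(probs))

def two_level_combine(groups, lower=noisy_or, upper=np.mean):
    # Lower-level rule combines ground instances of one statement;
    # upper-level rule combines the per-statement results.
    return upper([lower(g) for g in groups])

def predict(theta, example):
    # example: list of statements, each a list of feature vectors (ground instances).
    groups = [[sigmoid(np.dot(theta, x)) for x in stmt] for stmt in example]
    return two_level_combine(groups)

def neg_log_likelihood(theta, data):
    eps = 1e-9
    nll = 0.0
    for example, y in data:
        p = predict(theta, example)
        nll -= y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps)
    return nll

def fit(data, dim, lr=0.1, steps=200, h=1e-5):
    # Gradient descent with a finite-difference gradient (for brevity only).
    theta = np.zeros(dim)
    for _ in range(steps):
        grad = np.zeros(dim)
        for j in range(dim):
            e = np.zeros(dim); e[j] = h
            grad[j] = (neg_log_likelihood(theta + e, data)
                       - neg_log_likelihood(theta - e, data)) / (2 * h)
        theta -= lr * grad
    return theta

# Toy usage: one example with two statements (3 and 2 ground instances), label 1.
rng = np.random.default_rng(0)
example = [[rng.normal(size=2) for _ in range(3)],
           [rng.normal(size=2) for _ in range(2)]]
data = [(example, 1)]
theta = fit(data, dim=2)
print(predict(theta, example))
```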
Similar resources
Decision-Driven Models with Probabilistic Soft Logic
We introduce the concept of a decision-driven model, a probabilistic model that reasons directly over the uncertain information of interest to a decision maker. We motivate the use of these models from the perspective of personalized medicine. Decision-driven models have a number of benefits that are of particular value in this domain, such as being easily interpretable and naturally quantifyin...
On the Use of Combining Rules in Relational Probability Trees
A relational probability tree (RPT) is a type of decision tree that can be used for probabilistic classification of instances with a relational structure. Each leaf of an RPT contains a probability model that determines for each class the probability that an instance belongs to that class. The only kind of probability models that have been used in RPTs so far are multinomial probability distrib...
Rule-based joint fuzzy and probabilistic networks
One of the important challenges in graphical models is dealing with uncertainty. Among graphical networks, the fuzzy cognitive map can model only fuzzy uncertainty, while the Bayesian network can model only probabilistic uncertainty. In many real-world problems, we are faced with both fuzzy and probabilistic uncertainties. In these cases, the propo...
Towards a Toolbox for Relational Probabilistic Knowledge Representation, Reasoning, and Learning
This paper presents KReator, a versatile and easy-to-use toolbox for statistical relational learning currently under development. The research on combining probabilistic models and first-order theory has put forth a lot of different approaches in the past few years. While every approach has advantages and disadvantages, the variety of prototypical implementations makes thorough comparisons of differe...
Learning Probabilities for Noisy First-Order Rules
First-order logic is the traditional basis for knowledge representation languages. However, its applicability to many real-world tasks is limited by its inability to represent uncertainty. Bayesian belief networks, on the other hand, are inadequate for complex KR tasks due to the limited expressivity of the underlying (propositional) language. The need to incorporate uncertainty into an express...
Publication date: 2005